The more applications you submit, the better your chances of landing a job!
Here is a snapshot of women job seekers' activity over the past month:
Number of opportunities browsed
Number of applications submitted
Keep browsing and applying to increase your chances of landing a job!
Looking for employers with a proven track record of supporting and empowering women?
Click here to discover the opportunities available now!

We invite you to take part in a survey designed to help researchers better understand the best ways to connect job seekers with the jobs they are looking for.
Would you like to participate?
If you are selected, we will contact you by email with the details and instructions for participating.
You will receive $7 for completing the survey.
Aspire Software is looking for a Data Engineer to join our team in Lebanon.
Here is a little window into our company: Aspire Software operates and manages wholly owned software companies, providing mission-critical solutions across multiple verticals.
By implementing industry best practices, Aspire delivers a time-sensitive integration process, and its decentralized operating model has allowed it to become a hub for rapid growth by reinvesting in its portfolio.
About the Role: Valsoft Corporation is seeking a skilled Data Engineer to join our Finance & Acquisition Data and Reporting team.
In this role, you will design, build, and maintain scalable data pipelines and analytics infrastructure that support financial reporting, acquisition analytics, forecasting, and executive decision-making across Valsoft’s portfolio of companies.
You will work closely with Finance, M&A, Reporting, and Engineering stakeholders to deliver reliable, high-quality data solutions.
The role involves owning data pipelines end-to-end, improving data quality and performance, and translating complex business requirements into well-designed data models and workflows.
This position is well suited for a data engineer with 4+ years of experience who is comfortable working with production data systems, enjoys solving data reliability and scalability challenges, and wants to make a direct impact on financial and strategic outcomes.
Key Responsibilities:
- Design, build, and maintain scalable and reliable ETL/ELT pipelines
- Own and optimize Snowflake data models using dbt, including testing and documentation
- Ingest and manage data from multiple sources using Fivetran and/or Stitch
- Orchestrate, monitor, and troubleshoot workflows using Apache Airflow
- Write high-performance, production-grade SQL and Python code
- Implement data quality checks, monitoring, and performance optimizations
- Build and maintain API-based integrations between applications and the data warehouse
- Work with AWS services (S3, Lambda, IAM, API Gateway, etc.) to support data workflows
- Partner with Finance and M&A stakeholders to deliver analytics, reporting, and forecasting solutions
- Support BI tools such as Power BI (preferred) or Tableau
- Contribute to best practices around version control, CI/CD, testing, and deployment
- Mentor junior team members and contribute to improving team standards and documentation
- Participate in architecture discussions and continuous improvement initiatives

Requirements:
- 4+ years of professional experience as a Data Engineer or similar role
- Strong hands-on experience with: Snowflake, dbt, AWS (S3, IAM, Lambda, API Gateway, or similar), Apache Airflow, Fivetran and/or Stitch
- Advanced proficiency in SQL
- Strong working experience with Python
- Experience supporting production-grade data systems
- Solid understanding of: data warehousing concepts and dimensional modeling, ETL/ELT design patterns, structured and semi-structured data (JSON, Parquet, XML, etc.)
- Experience working with BI tools (Power BI preferred, Tableau acceptable)

Preferred Skills and Qualifications:
- Experience in Finance, Accounting, or M&A analytics
- Exposure to forecasting, budgeting, or financial reporting data
- Familiarity with data governance, security, and access controls
- Experience preparing data for AI/ML use cases
- Background in application development or API design
Your application for this job will not be considered, and it will be removed from the employer's inbox.